# Mini-batch optimization
**Wavlm Base Deepfake V2** by DavidCombei (Audio Classification, Transformers)
A speech processing model fine-tuned from microsoft/wavlm-base, achieving 99.62% accuracy on the evaluation set.
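Checkpoints like this are normally consumed through the Transformers audio-classification pipeline. The sketch below is an assumed usage pattern, not taken from the model card, and the checkpoint ID is a placeholder rather than the model's real hub name.

```python
from transformers import pipeline

# Placeholder checkpoint ID; substitute the real hub name of the
# fine-tuned WavLM deepfake classifier.
classifier = pipeline(
    "audio-classification",
    model="your-namespace/wavlm-base-deepfake-v2",
)

# Returns label/score pairs for the clip, e.g. bona fide vs. spoofed speech.
print(classifier("sample.wav"))
```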
**Wavlm Basic S R 5c 8batch 5sec 0.0001lr Unfrozen** by reralle (Audio Classification, Transformers)
A speech processing model fine-tuned from microsoft/wavlm-large, achieving 75% accuracy on the evaluation set.
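The name of this checkpoint encodes its optimization settings: mini-batches of 8 examples, 5-second clips, a learning rate of 0.0001, and an unfrozen encoder. A minimal sketch of how that configuration might look with the Transformers Trainer API is below; everything apart from those numbers (label count, epochs, dataset handling) is an assumption.

```python
from transformers import (
    AutoFeatureExtractor,
    AutoModelForAudioClassification,
    TrainingArguments,
)

checkpoint = "microsoft/wavlm-large"
# The feature extractor would pad/truncate the 5-second clips in each mini-batch.
feature_extractor = AutoFeatureExtractor.from_pretrained(checkpoint)
# The number of labels is an assumption, not stated in the listing.
model = AutoModelForAudioClassification.from_pretrained(checkpoint, num_labels=2)

# Keep the whole WavLM encoder trainable ("Unfrozen" in the model name);
# this is also the default, so nothing is frozen before training.
for param in model.wavlm.parameters():
    param.requires_grad = True

training_args = TrainingArguments(
    output_dir="wavlm-finetune",
    per_device_train_batch_size=8,  # mini-batch size from "8batch"
    learning_rate=1e-4,             # from "0.0001lr"
    num_train_epochs=5,             # assumption; not stated in the listing
)
```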
**Bert Base Uncased Issues 128** by cj-mills (Large Language Model, Transformers, Apache-2.0)
A fine-tuned version of bert-base-uncased, adapted for a specific downstream task.
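The "128" in names like this is commonly the chunk length used when continuing masked-language-model training from bert-base-uncased; under that assumption, the mini-batch setup might look like the following sketch. The batch size, learning rate, and dataset are placeholders, not details from the listing.

```python
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    TrainingArguments,
)

checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Masks 15% of the tokens in every mini-batch so the model learns to
# reconstruct them (standard masked language modeling).
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

training_args = TrainingArguments(
    output_dir="bert-base-uncased-issues-128",
    per_device_train_batch_size=32,  # placeholder mini-batch size
    learning_rate=5e-5,              # placeholder learning rate
)

# Training would pass 128-token chunks of the corpus, training_args, and
# data_collator to transformers.Trainer (dataset preparation not shown).
```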
**Bert Base Uncased Ganesh123** by stevems1 (Large Language Model, Transformers)
A fine-tuned version of the BERT base model; its intended use case and training data are not documented.
**Roberta Retrained Ru Covid Papers** by Daryaflp (Large Language Model, Transformers)
A Russian-language model fine-tuned from roberta-retrained_ru_covid on an unspecified dataset, likely aimed at processing COVID-19 research papers.
**Python Gpt2 Large Issues 128** by aytugkaya (Large Language Model, Transformers, Apache-2.0)
A GPT-2 large model, fine-tuned from bert-base-uncased, specializing in Python code-related issues.
**Distilbert Srb Base Cased Oscar** by Aleksandar (Large Language Model, Transformers)
A Serbian base model with the DistilBERT architecture, fine-tuned on the OSCAR dataset and suited to masked language modeling tasks.
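A masked language model like this is typically queried through the fill-mask pipeline. The sketch below assumes that usage; the checkpoint ID is a placeholder for the model's actual hub name.

```python
from transformers import pipeline

# Placeholder checkpoint ID for the Serbian DistilBERT masked LM.
fill_mask = pipeline("fill-mask", model="your-namespace/distilbert-srb-base-cased-oscar")

# Prints the highest-scoring completions for the [MASK] token.
for prediction in fill_mask("Beograd je glavni grad [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```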